Audiovisual asynchrony detection for speech and nonspeech signals

Authors

  • Brianna L. Conrey
  • David B. Pisoni
Abstract

This study investigated the “intersensory temporal synchrony window” [1] for audiovisual (AV) signals. A speeded asynchrony detection task was used to measure each participant’s temporal synchrony window for speech and nonspeech signals over an 800-ms range of AV asynchronies. Across three sets of stimuli, the video-leading threshold for asynchrony detection was larger than the audio-leading threshold, replicating previous findings reported in the literature. Although the audio-leading threshold did not differ across the stimulus sets, the video-leading threshold was significantly larger for the point-light display (PLD) condition than for either the full-face (FF) or nonspeech (NS) conditions. In addition, a small but reliable effect of visual intelligibility was found for the FF condition: high visual intelligibility words produced larger video-leading thresholds than low visual intelligibility words. Relationships with recent neurophysiological data on multisensory enhancement and convergence are discussed.
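The threshold measurement described above can be illustrated with a minimal sketch. The SOA values, response proportions, and the 50% crossing criterion below are all hypothetical assumptions for illustration, not data or analysis steps taken from the paper; the idea is simply that the asynchrony at which detection crosses a fixed criterion is estimated separately on the audio-leading and video-leading sides.

```python
# Hypothetical sketch: estimating audio-leading and video-leading
# asynchrony-detection thresholds from the proportion of
# "asynchronous" responses at each stimulus onset asynchrony (SOA).
# Negative SOAs mean audio leads; positive SOAs mean video leads.
# The 50% crossing on each side is taken as the threshold
# (an assumed criterion, not the paper's).

def crossing(soas, props, criterion=0.5):
    """Linearly interpolate the SOA where props crosses the criterion."""
    for (s0, p0), (s1, p1) in zip(zip(soas, props),
                                  zip(soas[1:], props[1:])):
        if (p0 - criterion) * (p1 - criterion) <= 0 and p0 != p1:
            return s0 + (criterion - p0) * (s1 - s0) / (p1 - p0)
    return None

# Illustrative responses over an 800-ms range (-400 to +400 ms)
soas  = [-400, -300, -200, -100, 0, 100, 200, 300, 400]
props = [0.95, 0.80, 0.55, 0.20, 0.05, 0.10, 0.30, 0.60, 0.90]

audio_lead = [(s, p) for s, p in zip(soas, props) if s <= 0]
video_lead = [(s, p) for s, p in zip(soas, props) if s >= 0]

# Audio-leading side: detection falls as SOA approaches zero
a_thresh = crossing([s for s, _ in audio_lead],
                    [p for _, p in audio_lead])
# Video-leading side: detection rises as SOA grows
v_thresh = crossing([s for s, _ in video_lead],
                    [p for _, p in video_lead])
```

With these made-up proportions the video-leading threshold comes out larger than the magnitude of the audio-leading one, mirroring the asymmetry the abstract reports.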


Similar articles

Audiovisual asynchrony detection in human speech.

Combining information from the visual and auditory senses can greatly enhance intelligibility of natural speech. Integration of audiovisual speech signals is robust even when temporal offsets are present between the component signals. In the present study, we characterized the temporal integration window for speech and nonspeech stimuli with similar spectrotemporal structure to investigate to w...


Detection of Auditory-Visual Asynchrony in Speech and Nonspeech Signals

Two experiments were conducted to examine the temporal limitations on the detection of asynchrony in auditory-visual (AV) signals. Each participant made asynchrony judgments about speech and nonspeech signals presented over an 800-ms range of AV onset asynchronies. Consistent with previous findings, all conditions revealed a wide window of several hundred milliseconds over which AV signals were...


Multimodal Sentence Intelligibility and the Detection of Auditory-Visual Asynchrony in Speech and Nonspeech Signals: A First Report

The ability to perceive and understand visual-only speech and the benefit experienced from having both auditory and visual signals available during speech perception tasks varies widely in the normal-hearing population. At the present time, little is known about the underlying neural mechanisms responsible for this variability or the possible relationships between multisensory speech perception...


Neural Correlates of Temporal Complexity and Synchrony during Audiovisual Correspondence Detection

We often perceive real-life objects as multisensory cues through space and time. A key challenge for audiovisual integration is to match neural signals that not only originate from different sensory modalities but also that typically reach the observer at slightly different times. In humans, complex, unpredictable audiovisual streams lead to higher levels of perceptual coherence than predictabl...


Auditory, Visual and Audiovisual Speech Processing Streams in Superior Temporal Sulcus

The human superior temporal sulcus (STS) is responsive to visual and auditory information, including sounds and facial cues during speech recognition. We investigated the functional organization of STS with respect to modality-specific and multimodal speech representations. Twenty younger adult participants were instructed to perform an oddball detection task and were presented with auditory, v...




Publication date: 2003